
    Circle detection on images using Learning Automata

    Circle detection in digital images has received considerable attention from the computer vision community in recent years, with a substantial body of research devoted to finding an optimal detector. This article presents an algorithm for the automatic detection of circular shapes in complex and noisy images that does not rely on the conventional Hough transform. The proposed algorithm is based on Learning Automata (LA), a probabilistic optimization method that explores an unknown random environment and progressively improves its performance via a reinforcement signal (objective function). The approach encodes a candidate circle as three non-collinear points on the edge image. A reinforcement signal (matching function) indicates whether such a candidate circle is actually present in the edge map. Guided by the values of this reinforcement signal, the probability set of the encoded candidate circles is modified by the LA algorithm so that the candidates fit the actual circles in the edge map. Experimental results on several complex synthetic and natural images validate the efficiency of the proposed technique in terms of accuracy, speed, and robustness.
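    As a rough illustration of the encoding described above (not the authors' implementation), the sketch below constructs the circle passing through three non-collinear edge points and scores it by the fraction of sampled perimeter points that land on edge pixels; the LA probability update itself is omitted, and all names and sample counts are placeholders.

```python
import numpy as np

def circle_from_points(p1, p2, p3):
    """Return (cx, cy, r) of the circle through three non-collinear points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-9:                      # collinear points -> no unique circle
        return None
    cx = ((x1**2 + y1**2) * (y2 - y3) +
          (x2**2 + y2**2) * (y3 - y1) +
          (x3**2 + y3**2) * (y1 - y2)) / d
    cy = ((x1**2 + y1**2) * (x3 - x2) +
          (x2**2 + y2**2) * (x1 - x3) +
          (x3**2 + y3**2) * (x2 - x1)) / d
    r = np.hypot(cx - x1, cy - y1)
    return cx, cy, r

def matching_score(edge_map, circle, n_samples=100):
    """Fraction of sampled perimeter points that coincide with edge pixels
    (samples falling outside the image count as misses)."""
    cx, cy, r = circle
    theta = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    xs = np.round(cx + r * np.cos(theta)).astype(int)
    ys = np.round(cy + r * np.sin(theta)).astype(int)
    h, w = edge_map.shape
    inside = (xs >= 0) & (xs < w) & (ys >= 0) & (ys < h)
    hits = edge_map[ys[inside], xs[inside]] > 0
    return hits.sum() / float(n_samples)
```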

    Soundscape Indices: New Features for Classifying Beehive Audio Samples

    As the study of honey bee health has gained attention in the biology community, researchers have looked for new, non-invasive methods to monitor the health status of the colony. Since the sound of a beehive changes when the colony is exposed to stressors, analysis of the colony's acoustic response has been used to identify the type of stressor, whether chemical, pest, or disease. So far, two feature sets have been used successfully for this kind of analysis: low-level signal features and Mel Frequency Cepstral Coefficients (MFCC). Here we propose using soundscape indices, originally developed to characterize acoustic diversity in ecosystems, as an alternative to the currently used features. In our study, we examine the beehive's acoustic response to trichloromethane-laced air and to blank air, and compare how well all three feature sets distinguish the colony's sound when it is exposed to the chemical from when it is not. Our results show that soundscape indices outperform the alternative feature sets on this task. Based on these findings, we consider soundscape indices a valid feature set for beehive sound analysis and present our results to bring this to the community's attention.
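    For readers unfamiliar with the two kinds of features, the following sketch contrasts a standard MFCC summary with one simple soundscape-style index (normalized spectral entropy, a component of the acoustic entropy index H). It is not the paper's pipeline; the file name, STFT parameters, and the particular index are assumptions, and the study's actual set of indices may differ.

```python
import numpy as np
import librosa

y, sr = librosa.load("hive_sample.wav", sr=None)        # hypothetical recording

# Classic feature set: 13 MFCCs averaged over the clip.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
mfcc_features = mfcc.mean(axis=1)

# Soundscape-style feature: normalized spectral entropy of the mean spectrum.
spec = np.abs(librosa.stft(y, n_fft=2048, hop_length=512)) ** 2
mean_spectrum = spec.mean(axis=1)
p = mean_spectrum / mean_spectrum.sum()
spectral_entropy = -np.sum(p * np.log2(p + 1e-12)) / np.log2(len(p))

features = np.concatenate([mfcc_features, [spectral_entropy]])
# `features` can then feed any standard classifier (e.g. an SVM) to separate
# chemically stressed from unstressed recordings.
```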

    Automatic methods for long-term tracking and the detection and decoding of communication dances in honeybees

    The honeybee waggle dance communication system is an intriguing example of abstract animal communication and has been investigated thoroughly over the last seven decades. Typically, observables such as waggle durations or body angles are extracted manually, either directly from the observation hive or from video recordings, to quantify properties of the dance and related behaviors. In recent years, biology has profited from automation, which improves measurement precision, removes human bias, and accelerates data collection. We have developed technologies to track all individuals of a honeybee colony and to detect and decode communication dances automatically. In strong contrast to conventional approaches that focus on a small subset of the hive's life, whether in time, space, or animal identity, our more inclusive system will help in understanding the dance comprehensively in its spatial, temporal, and social context. In this contribution, we present full specifications of the recording setup and the software for the automatic recognition of individually tagged bees and the decoding of dances. We discuss potential research directions that may benefit from the proposed automation. Lastly, to exemplify the power of the methodology, we show experimental data and corresponding analyses from a continuous, nine-week experimental recording.

    A Computer Vision System for the Automatic Analysis of Social Networks in Honey Bee Colonies

    This thesis describes the development and implementation of the BeesBook system, a computer vision based solution for the automatic detection and analysis of behavioral patterns of honey bee colonies at the individual and collective level. The behavioral analysis of honey bee colonies requires extensive data sets describing the behavior of individual colony members. These data sets must often be created manually, a time-consuming and cumbersome activity. Consequently, behavioral data sets are usually restricted to small subsets of the colony's life, whether in time, space, or animal identity. By automating the data acquisition process, the BeesBook system allows a larger number of individuals to be observed over longer periods of time, opening the door to more sophisticated, inclusive, and significant studies. The BeesBook system uses unique binary markers attached to the bees to keep track of their position and identity via computer vision software. The markers' flexible design allows the implementation of a variety of error-correcting codes, depending on the study's goals and the colony's population size. The markers adapt to the bee's thorax shape, creating a surface that withstands heavy-duty activity in and outside of the hive. Three recording seasons were conducted during the summers of 2014, 2015, and 2016 to evaluate and improve the performance of the system components. Each season extended over a period of nine weeks and generated approximately 65 million images. Prior to the beginning of each season, all members of a bee colony were individually tagged and transferred to an observation hive. The activity inside the hive was recorded using an array of four high-resolution cameras and stored for later analysis on one of the complexes of the North-German Supercomputing Alliance. Communication dances were identified in real time using a second set of cameras, consisting of two webcams running at high frame rates. During the off-season, the experimental design was optimized to ensure that the generated data better serve the goals of the experiment. Stored images were processed using highly optimized computer vision software to obtain the position, orientation, and ID of every marked bee. These data are further processed to generate motion paths for the colony members, which, combined with data on the communication dances, constitute an unprecedented body of knowledge on the inner life of the honey bee colony. The information obtained through this system establishes the conditions for consolidating our understanding of already known behaviors. Furthermore, this research has produced previously unavailable behavioral data, which ultimately extends our knowledge of bee colonies and their collective intelligence.
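    The thesis deliberately leaves the exact marker code flexible, so the following is only a generic illustration of how an ID can carry redundancy for error detection; it is not the actual BeesBook tag layout or code, and the ID width and parity scheme are placeholders.

```python
def encode_id(bee_id, id_bits=12):
    """Encode an ID as id_bits data bits plus 3 interleaved parity bits."""
    bits = [(bee_id >> i) & 1 for i in range(id_bits)]
    parity = [sum(bits[i::3]) % 2 for i in range(3)]
    return bits + parity

def check_id(codeword, id_bits=12):
    """Return the decoded ID, or None if a parity check fails."""
    bits, parity = codeword[:id_bits], codeword[id_bits:]
    if any(sum(bits[i::3]) % 2 != parity[i] for i in range(3)):
        return None                     # decoding error detected
    return sum(b << i for i, b in enumerate(bits))

codeword = encode_id(1729)
assert check_id(codeword) == 1729
```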

    Automatic detection and decoding of honey bee waggle dances.

    The waggle dance is one of the most popular examples of animal communication. Forager bees direct their nestmates to profitable resources via a complex motor display. Essentially, the dance encodes the polar coordinates of the resource in the field. Unemployed foragers follow the dancer's movements and then search for the advertised spots in the field. Over the last decades, biologists have employed different techniques to measure key characteristics of the waggle dance and decode the information it conveys. Early techniques involved the use of protractors and stopwatches to measure the dance orientation and duration directly from the observation hive. Recent approaches employ digital video recordings and manual measurements on screen. However, manual approaches are very time-consuming. Most studies therefore regard only small numbers of animals over short periods of time. We have developed a system capable of automatically detecting, decoding, and mapping communication dances in real time. In this paper, we describe our recording setup, the image processing steps performed for dance detection and decoding, and an algorithm to map dances to the field. The proposed system performs with a detection accuracy of 90.07%. The decoded waggle orientation has an average error of -2.92° (± 7.37°), well within the range of human error. To evaluate and exemplify the system's performance, a group of bees was trained to an artificial feeder, and all dances in the colony were automatically detected, decoded, and mapped. The system presented here is the first of its kind made publicly available, including source code and hardware specifications. We hope this will foster quantitative analyses of the honey bee waggle dance.
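    The mapping idea can be illustrated with a hedged sketch: the waggle duration is converted to distance with a linear calibration, and the waggle angle relative to gravity is added to the sun's azimuth to obtain a compass bearing. The calibration constant and the example values below are placeholders, not the paper's fitted parameters.

```python
import math

def dance_to_field(waggle_angle_deg, waggle_duration_s,
                   sun_azimuth_deg, metres_per_second=750.0):
    """Convert a decoded dance into an (east, north) offset from the hive.

    waggle_angle_deg: waggle run orientation relative to vertical (gravity).
    sun_azimuth_deg:  sun azimuth at dance time, clockwise from north.
    metres_per_second: assumed duration-to-distance calibration (placeholder).
    """
    bearing = (sun_azimuth_deg + waggle_angle_deg) % 360.0   # compass bearing
    distance = metres_per_second * waggle_duration_s
    east = distance * math.sin(math.radians(bearing))
    north = distance * math.cos(math.radians(bearing))
    return east, north

# Example: a 1.2 s waggle run, 30 degrees right of vertical, sun due south.
print(dance_to_field(30.0, 1.2, 180.0))
```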

    Tracking All Members of a Honey Bee Colony Over Their Lifetime Using Learned Models of Correspondence

    Computational approaches to the analysis of collective behavior in social insects increasingly rely on motion paths as an intermediate data layer from which one can infer individual behaviors or social interactions. Honey bees are a popular model for learning and memory, and previous experience has been shown to affect and modulate future social interactions. So far, no lifetime history observations have been reported for all bees of a colony. In previous work we introduced a recording setup customized to track up to 4,000 marked bees over several weeks. Due to detection and decoding errors of the bee markers, linking the correct correspondences through time is non-trivial. In this contribution we present an in-depth description of the underlying multi-step algorithm, which produces motion paths and also significantly improves the marker decoding accuracy. The proposed solution employs two classifiers: one to predict the correspondence of two consecutive detections in the first step, and one to predict the correspondence of two tracklets in the second. We automatically tracked approximately 2,000 marked honey bees over 10 weeks with inexpensive recording hardware, using markers without any error-correction bits. We found that the proposed two-step tracking reduced incorrect ID decodings from initially ~13% to around 2% after tracking. Alongside this paper, we publish the first trajectory dataset for all bees in a colony, extracted from ~3 million images covering 3 days. We invite researchers to join the collective scientific effort to investigate this intriguing animal system. All components of our system are open source.
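    As a rough stand-in for the learned correspondence classifiers, the sketch below links detections between consecutive frames with a hand-crafted cost (spatial distance plus ID-bit disagreement, with an arbitrary weight) and the Hungarian algorithm. The actual system learns this score and adds a second, tracklet-merging stage, neither of which is reproduced here.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_detections(prev, curr, max_cost=50.0):
    """prev/curr: lists of dicts with 'xy' (np.array of shape (2,))
    and 'id_bits' (np.array of per-bit confidences in [0, 1])."""
    cost = np.zeros((len(prev), len(curr)))
    for i, a in enumerate(prev):
        for j, b in enumerate(curr):
            dist = np.linalg.norm(a["xy"] - b["xy"])
            id_mismatch = np.abs(a["id_bits"] - b["id_bits"]).mean()
            cost[i, j] = dist + 100.0 * id_mismatch   # weight chosen arbitrarily
    # Clamp over-threshold entries so the assignment stays feasible,
    # then drop any matched pair whose true cost exceeds the gate.
    clamped = np.where(cost > max_cost, 1e6, cost)
    rows, cols = linear_sum_assignment(clamped)
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_cost]
```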

    Difference image and its Fourier transformation.

    (A) The image resulting from subtracting consecutive video frames of waggling bees exhibits a characteristic Gabor-filter-like pattern. (B) While the peak location varies in image space along with the dancer's position, its representation in Fourier space is location-independent, showing distinctive peaks at frequencies related to the size and spacing of the Gabor-like pattern.
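    A minimal sketch of the operation behind this figure, assuming consecutive grayscale frames as NumPy arrays: subtract the frames and inspect the magnitude of the 2D Fourier transform, which is insensitive to where the dancer sits in the image.

```python
import numpy as np

def waggle_signature(frame_prev, frame_curr):
    """Return the difference image and its centered FFT magnitude spectrum."""
    diff = frame_curr.astype(np.float32) - frame_prev.astype(np.float32)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(diff)))
    return diff, spectrum   # peaks in `spectrum` reflect the Gabor-like stripes
```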

    Fundamental parameters.

    Knowing the starting time (t_x) and duration (d_wx) of each waggle run, it is possible to calculate the return run durations as the time gaps between consecutive waggle runs.
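    In code, the relation in this caption reads as follows (a trivial sketch, with start times and waggle durations given as equal-length lists in seconds):

```python
def return_run_durations(start_times, waggle_durations):
    """Return run i = gap between the end of waggle run i and the start of run i+1."""
    return [start_times[i + 1] - (start_times[i] + waggle_durations[i])
            for i in range(len(start_times) - 1)]
```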